Remote Source Coding under Gaussian Noise: Dueling Roles of Power and Entropy-Power

Authors

  • Krishnan Eswaran
  • Michael Gastpar
Abstract

Lossy source coding under the mean-squared error fidelity criterion is considered. The rate-distortion function can be expressed in closed form only for very special cases, including Gaussian sources. The classical upper and lower bounds look exactly alike, except that the upper bound contains the source power (variance) whereas the lower bound contains the source entropy-power. This pleasing duality of power and entropy-power extends to the case of remote source coding, i.e., the case where the encoder only gets to observe the source through a noisy channel. Bounds are presented both for the centralized and for the distributed case, the latter often referred to as the CEO problem.
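For orientation, the classical bounds alluded to above can be written out explicitly. The following is the standard textbook statement (notation ours, not taken from the paper): for a source X with variance σ_X², differential entropy h(X), and mean-squared error distortion 0 < D ≤ σ_X²,

\[
\tfrac{1}{2}\log\frac{N(X)}{D} \;\le\; R(D) \;\le\; \tfrac{1}{2}\log\frac{\sigma_X^{2}}{D},
\qquad N(X) \triangleq \frac{1}{2\pi e}\, e^{2h(X)} .
\]

The two bounds coincide exactly when X is Gaussian, since then N(X) = σ_X²; this is the power/entropy-power duality that the paper extends to the remote (noisy-observation) setting.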


Similar References

Information content in Gaussian noise: optimal compression rates

We approach the theoretical problem of compressing a signal dominated by Gaussian noise. We present accurate expressions for the compression ratio which can be reached, in the light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum P(f) of the noise. ...
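As an illustrative aside (our addition, not part of the abstract above), the logarithmic behaviour can be seen from the high-resolution entropy of a uniformly quantized Gaussian sample: for variance σ² and quantization step Δ,

\[
H \;\approx\; h(X) - \log_2 \Delta \;=\; \tfrac{1}{2}\log_2\!\big(2\pi e\,\sigma^{2}\big) - \log_2 \Delta \quad \text{bits per sample},
\]

so the achievable compression ratio relative to a fixed-word-length encoding varies only logarithmically with the noise amplitude, and hence with the level of the spectrum P(f).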


A Hybrid Cooperative Coding Scheme for the Relay-Eavesdropper Channel

This paper considers the four-node relay-eavesdropper channel, where a relay node helps the source to send secret messages to the destination in the presence of a passive eavesdropper. For the discrete memoryless case, we propose a hybrid cooperative coding scheme, which is based on the combination of the partial decode-forward scheme, the noise-forward scheme and the random binning scheme. The...
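For context (our addition, assuming only the standard wiretap-channel background), the secrecy capacity of the discrete memoryless wiretap channel without a relay is given by the Csiszár–Körner expression

\[
C_s \;=\; \max_{p(v)\,p(x\mid v)} \big[\, I(V;Y) - I(V;Z) \,\big],
\]

where Y is the legitimate receiver's output and Z the eavesdropper's observation; the partial decode-forward, noise-forward, and random binning components mentioned above are different ways of enlarging the corresponding achievable secrecy region once a relay is added.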


Speech Enhancement Using Gaussian Mixture Models, Explicit Bayesian Estimation and Wiener Filtering

Gaussian Mixture Models (GMMs) of power spectral densities of speech and noise are used with explicit Bayesian estimations in Wiener filtering of noisy speech. No assumption is made on the nature or stationarity of the noise. No voice activity detection (VAD) or any other means is employed to estimate the input SNR. The GMM mean vectors are used to form sets of over-determined system of equatio...
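As a point of reference (our addition, using the standard frequency-domain formulation rather than anything specific to this paper), the Wiener filter for noisy speech Y(f) = S(f) + N(f) has the gain

\[
H_W(f) \;=\; \frac{P_S(f)}{P_S(f) + P_N(f)},
\]

where P_S(f) and P_N(f) are the speech and noise power spectral densities; the GMM-based Bayesian estimates described above supply these quantities without a voice activity detector or an input-SNR estimate.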


Information content in uniformly discretized Gaussian noise: optimal compression rates

We approach the theoretical problem of compressing a signal dominated by Gaussian noise. We present expressions for the compression ratio which can be reached, in the light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum P(f) of the noise. Entropy ...


An Alternative Proof of an Extremal Entropy Inequality

This paper first focuses on deriving an alternative approach for proving an extremal entropy inequality (EEI), originally presented in [11]. The proposed approach does not rely on the channel enhancement technique, and has the advantage that it yields an explicit description of the optimal solution as opposed to the implicit approach of [11]. Compared with the proofs in [11], the proposed alter...



Publication date: 2018